
# Continuous Pretraining

## Nous Consilience 40B

Nous Consilience 40B is a 40-billion-parameter generative text model pretrained from scratch in a decentralized manner. It supports multiple languages and is designed to represent the broad spectrum of human creative output.

- Tags: Large Language Model, Multilingual
- Developer: PsycheFoundation
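
Since the card gives no usage details, here is a minimal loading sketch assuming the checkpoint is published on the Hugging Face Hub as a standard causal language model; the repo id below is a hypothetical placeholder and may differ from the actual published name.

```python
# Minimal usage sketch for a Hub-hosted causal LM.
# NOTE: "PsycheFoundation/consilience-40b" is a hypothetical repo id.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "PsycheFoundation/consilience-40b"  # hypothetical placeholder
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, device_map="auto")

inputs = tokenizer("The history of decentralized computing", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```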

## K-12BERT

K-12BERT is a BERT model obtained through continuous pretraining on K-12 basic-education data, optimized specifically for educational scenarios.

- License: Apache-2.0
- Tags: Large Language Model, Transformers, English
- Developer: vasugoel
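
As context for the tag itself, the sketch below shows what continuous (domain-adaptive) pretraining of a BERT model with the masked-language-modeling objective can look like using Hugging Face Transformers. The corpus file name is a hypothetical placeholder, and the hyperparameters are illustrative; neither is taken from the actual K-12BERT training setup.

```python
# A minimal sketch of continuous pretraining via masked language modeling.
# NOTE: "k12_corpus.txt" is a hypothetical stand-in for a real K-12 corpus.
from datasets import load_dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

# Load a plain-text domain corpus and tokenize it.
dataset = load_dataset("text", data_files={"train": "k12_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

# Randomly mask 15% of tokens, the standard BERT MLM objective.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="k12bert-cpt",          # illustrative output directory
        num_train_epochs=1,                # illustrative hyperparameters
        per_device_train_batch_size=8,
    ),
    train_dataset=tokenized,
    data_collator=collator,
)
trainer.train()
```

The key design point of continuous pretraining is that training starts from an existing general-purpose checkpoint rather than random weights, so the model adapts to the domain corpus while retaining its general language knowledge.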